Globally convergent modified Perry's conjugate gradient method
Authors
Abstract
Conjugate gradient methods are among the most widely used iterative methods for solving large-scale optimization problems in scientific and engineering computation, owing to the simplicity of their iteration and their low memory requirements. In this paper, we propose a new conjugate gradient method, based on the MBFGS secant condition, obtained by modifying Perry's method. The proposed method ensures sufficient descent independent of the accuracy of the line search, and it is globally convergent under some assumptions. Numerical experiments are also presented. © 2012 Elsevier Inc. All rights reserved.
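To illustrate the kind of iteration the abstract describes, here is a minimal generic nonlinear conjugate gradient sketch, not the authors' exact Perry-type update: it uses a Hestenes-Stiefel beta and a simple restart safeguard (both illustrative assumptions) to keep the direction a descent direction regardless of line-search accuracy, with a backtracking Armijo line search on a small quadratic test problem.

```python
import numpy as np

def f(x, A, b):
    # convex quadratic test objective: 0.5 x^T A x - b^T x
    return 0.5 * x @ A @ x - b @ x

def grad(x, A, b):
    return A @ x - b

def cg_sketch(A, b, x0, tol=1e-8, max_iter=500):
    """Generic nonlinear CG sketch (hypothetical, NOT the paper's method):
    Hestenes-Stiefel beta plus a restart rule that enforces descent."""
    x = x0.copy()
    g = grad(x, A, b)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search
        t, c, rho = 1.0, 1e-4, 0.5
        while f(x + t * d, A, b) > f(x, A, b) + c * t * (g @ d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new, A, b)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d > -1e-10 * (g_new @ g_new):
            d = -g_new          # restart: guarantee a descent direction
        x, g = x_new, g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = cg_sketch(A, b, np.zeros(2))    # approximates solve(A, b)
```

On a strongly convex quadratic this converges to the minimizer `A^{-1} b`; the restart safeguard plays the role that the sufficient-descent property plays in the paper's analysis.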
Similar resources
Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property
Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
An extension of a three-term conjugate gradient method based on objective function values with guaranteed convergence without a convexity assumption
With respect to importance of the conjugate gradient methods for large-scale optimization, in this study a descent three-term conjugate gradient method is proposed based on an extended modified secant condition. In the proposed method, objective function values are used in addition to the gradient information. Also, it is established that the method is globally convergent without convexity assu...
Global Convergence of a Modified Liu-Storey Conjugate Gradient Method
In this paper, we make a modification to the LS conjugate gradient method and propose a descent LS method. The method generates a sufficient descent direction for the objective function. We prove that the method is globally convergent with an Armijo-type line search. Moreover, under mild conditions, we show that the method is globally convergent if the Armijo line search or the Wolfe line sea...
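The Armijo-type line search mentioned above can be sketched as follows; this is a standard backtracking implementation (a generic sketch, not this paper's exact rule), which shrinks the step until the sufficient-decrease condition f(x + t d) <= f(x) + c t ∇f(x)·d holds.

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, t0=1.0, c=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking Armijo line search: halve the step (rho=0.5) until the
    sufficient-decrease condition holds along the descent direction d."""
    fx = f(x)
    gd = grad_f(x) @ d          # directional derivative, must be negative
    t = t0
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + c * t * gd:
            return t
        t *= rho
    return t

# usage: f(x) = ||x||^2 at x = 2 with steepest-descent direction d = -4;
# t = 1 overshoots, the first backtrack (t = 0.5) lands on the minimizer
t = armijo_backtracking(lambda x: float(x @ x),
                        lambda x: 2 * x,
                        np.array([2.0]), np.array([-4.0]))
# t == 0.5 here
```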
A Modified Conjugate Gradient Method for Unconstrained Optimization
Conjugate gradient methods are an important class of methods for solving unconstrained optimization problems, especially large-scale problems. Recently, they have been studied in depth. In this paper, we further study the conjugate gradient method for unconstrained optimization. We focus our attention on the descent conjugate gradient method. This paper presents a modified conjugate gradien...
Two new conjugate gradient methods based on modified secant equations
Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...
Journal title: Applied Mathematics and Computation
Volume 218, Issue -
Pages -
Publication date: 2012